Look-Back and Look-Ahead in the Conversion of Hidden Markov Models into Finite State Transducers
Abstract
This paper describes the conversion of a Hidden Markov Model into a finite state transducer that closely approximates the behavior of the stochastic model. In some cases the transducer is equivalent to the HMM. This conversion is especially advantageous for part-of-speech tagging because the resulting transducer can be composed with other transducers that encode correction rules for the most frequent tagging errors. The speed of tagging is also improved. The described methods have been implemented and successfully tested.
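The abstract itself contains no code, but the core idea of replacing an HMM tagger by a deterministic finite-state lookup can be illustrated with a small sketch. The following Python example is a simplified first-order approximation, not the paper's exact look-back/look-ahead construction; the tag set, ambiguity classes, and all probabilities are invented toy values. It precomputes, for each pair of previous tag and ambiguity class, the tag that maximizes the product of transition and class-emission probability, so that tagging becomes a left-to-right table lookup of the kind a sequential transducer can encode.

```python
# Sketch: approximating a first-order HMM tagger by a deterministic lookup
# over (previous tag, ambiguity class) pairs.  All probabilities are invented
# toy values; the paper's actual look-back/look-ahead construction is richer.

from itertools import product

TAGS = ["DET", "NOUN", "VERB"]

# Transition probabilities p(tag | previous tag), toy values.
TRANS = {
    ("BOS", "DET"): 0.6, ("BOS", "NOUN"): 0.3, ("BOS", "VERB"): 0.1,
    ("DET", "DET"): 0.05, ("DET", "NOUN"): 0.85, ("DET", "VERB"): 0.10,
    ("NOUN", "DET"): 0.10, ("NOUN", "NOUN"): 0.30, ("NOUN", "VERB"): 0.60,
    ("VERB", "DET"): 0.50, ("VERB", "NOUN"): 0.40, ("VERB", "VERB"): 0.10,
}

# Emission probabilities p(ambiguity class | tag), toy values.
# An ambiguity class is the set of tags a word form can take.
CLASSES = [frozenset({"NOUN", "VERB"}), frozenset({"DET"}), frozenset({"NOUN"})]
EMIT = {
    (frozenset({"NOUN", "VERB"}), "NOUN"): 0.4,
    (frozenset({"NOUN", "VERB"}), "VERB"): 0.7,
    (frozenset({"DET"}), "DET"): 1.0,
    (frozenset({"NOUN"}), "NOUN"): 0.6,
}

def build_lookup():
    """For every (previous tag, ambiguity class) pair, pick the tag that
    maximizes p(tag | previous tag) * p(class | tag).  The resulting table
    behaves like the transition function of a sequential transducer."""
    table = {}
    for prev, cls in product(["BOS"] + TAGS, CLASSES):
        candidates = [
            (TRANS.get((prev, t), 0.0) * EMIT.get((cls, t), 0.0), t)
            for t in cls
        ]
        table[(prev, cls)] = max(candidates)[1]
    return table

def tag(classes, table):
    """Deterministic left-to-right tagging: no search, only table lookups."""
    prev, output = "BOS", []
    for cls in classes:
        t = table[(prev, cls)]
        output.append(t)
        prev = t
    return output

if __name__ == "__main__":
    table = build_lookup()
    sentence = [frozenset({"DET"}),
                frozenset({"NOUN", "VERB"}),
                frozenset({"NOUN", "VERB"})]
    print(tag(sentence, table))   # ['DET', 'NOUN', 'VERB']
```

Because the output tag depends only on the previous tag and the current ambiguity class, the table can be read directly as the arcs of a sequential transducer, which is what makes composition with correction-rule transducers possible.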
Similar resources
Lecture 8: Speech Recognition Using Finite State Transducers
In order to use HTK-trained speech recognition models with the AT&T speech recognition search engine, three types of conversion are necessary. First, you must convert the HTK-format hidden Markov models into AT&T-format acoustic models. Second, you’ll need to write finite state transducers for the language model, dictionary, and context dependency transducer. Third, acoustic feature files need t...
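The dictionary step mentioned above can be pictured with a generic sketch that is not tied to the HTK or AT&T toolkits: a pronunciation lexicon encoded as a deterministic transducer (a trie over phone sequences) mapping a phone string to the word it spells. The entries, phone symbols, and function names below are invented for illustration.

```python
# Generic sketch, not tied to the HTK or AT&T toolkits: a pronunciation
# dictionary encoded as a trie over phone sequences that maps a phone string
# to the word it spells.  Entries and phone symbols are invented examples.

def build_lexicon_fst(lexicon):
    """Build a trie over phone sequences; the word label is attached to the
    final state reached after the last phone of its pronunciation."""
    trans = {0: {}}            # state -> {phone: next state}
    word_at = {}               # final state -> word
    next_state = 1
    for word, phones in lexicon.items():
        state = 0
        for phone in phones:
            if phone not in trans[state]:
                trans[state][phone] = next_state
                trans[next_state] = {}
                next_state += 1
            state = trans[state][phone]
        word_at[state] = word
    return trans, word_at

def transduce(trans, word_at, phones):
    """Follow the phone sequence through the trie; return the word if the
    sequence spells a complete dictionary entry, otherwise None."""
    state = 0
    for phone in phones:
        if phone not in trans[state]:
            return None
        state = trans[state][phone]
    return word_at.get(state)

if __name__ == "__main__":
    lexicon = {"data": ["d", "ey", "t", "ax"], "dog": ["d", "ao", "g"]}
    trans, word_at = build_lexicon_fst(lexicon)
    print(transduce(trans, word_at, ["d", "ao", "g"]))   # dog
```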
Finite State Transducers Approximating Hidden Markov Models
This paper describes the conversion of a Hidden Markov Model into a sequential transducer that closely approximates the behavior of the stochastic model. This transformation is especially advantageous for part-of-speech tagging because the resulting transducer can be composed with other transducers that encode correction rules for the most frequent tagging errors. The speed of tagging is also i...
Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain
In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state-space homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
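For reference, the quantities this excerpt appeals to can be written in their standard form; the notation below is illustrative and not taken from the cited paper.

```latex
% Relative entropy between the length-n marginals of two processes P and Q
% over the same finite alphabet, and its per-symbol limit (the relative
% entropy rate).  Standard definitions; notation chosen here for illustration.
\[
  D\!\left(P_n \,\middle\|\, Q_n\right)
  \;=\; \sum_{x_1^n} P_n\!\left(x_1^n\right)
        \log \frac{P_n\!\left(x_1^n\right)}{Q_n\!\left(x_1^n\right)},
\]
\[
  D(P \,\|\, Q) \;=\; \lim_{n \to \infty} \frac{1}{n}\,
  D\!\left(P_n \,\middle\|\, Q_n\right),
  \quad \text{provided the limit exists.}
\]
```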
Table-driven look-ahead lexical analysis
Modern programming languages use regular expressions to define valid tokens. Traditional lexical analyzers based on minimum deterministic finite automata for regular expressions cannot handle the look-ahead problem. The scanner writer needs to explicitly identify the look-ahead states and code the buffering and re-scanning operations by hand. We identify the class of finite look-ahead finite au...
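The buffering and re-scanning that this excerpt refers to can be illustrated with a small table-driven scanner sketch. In the Python example below, the token set (INT and REAL, where a REAL requires a digit after the dot) and the transition table are invented for illustration: the scanner remembers the last accepting position it passed and, when the automaton dead-ends, rolls the input back to that point.

```python
# Sketch of table-driven maximal-munch scanning with explicit look-ahead
# handling: the scanner records the last accepting position and re-scans
# from there when the DFA dead-ends.  Token set and table are toy examples.

ACCEPTING = {1: "INT", 3: "REAL"}        # accepting state -> token type

def step(state, ch):
    """Transition table of the toy DFA; returns None on a dead end."""
    if state == 0 and ch.isdigit():
        return 1                         # first digit
    if state == 1 and ch.isdigit():
        return 1                         # more integer digits
    if state == 1 and ch == ".":
        return 2                         # dot seen; a digit must follow (look-ahead)
    if state == 2 and ch.isdigit():
        return 3                         # first fractional digit
    if state == 3 and ch.isdigit():
        return 3
    return None

def next_token(text, pos):
    """Return (token_type, lexeme, new_pos) using maximal munch with rollback."""
    state, i = 0, pos
    last_accept = None                   # (token type, position after lexeme)
    while i < len(text):
        nxt = step(state, text[i])
        if nxt is None:
            break
        state, i = nxt, i + 1
        if state in ACCEPTING:
            last_accept = (ACCEPTING[state], i)
    if last_accept is None:
        raise ValueError(f"no token at position {pos}")
    kind, end = last_accept
    return kind, text[pos:end], end      # roll back to the last accepting point

if __name__ == "__main__":
    # "12." dead-ends after the dot, so the scanner rolls back and emits INT "12".
    print(next_token("12.x", 0))         # ('INT', '12', 2)
    print(next_token("3.14", 0))         # ('REAL', '3.14', 4)
```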